Restructured torch modules to support shapr installation without torch #393
Conversation
…be moved from the folder it was trained in if one wants to continue training it. This is a limitation I should consider fixing, but I am unsure how often it will occur. I also made sure that continued training works if the explanation object was trained by giving a path.
I tested this locally both with and without the torch package, and it seems to work fine.
To test that it installs fine without the torch package, I deleted the torch package and then installed the package from this branch via devtools::install_github("LHBO/shapr", ref = "Lars/torch_module_restructure"). I can then run e.g. the README example. When running devtools::document() I get errors that torch is not installed. This seems to be related to the references to torch in the documentation. That is fine anyway, and similar to what we get when trying to run an example involving torch.
When having torch installed, all checks pass locally, and I can run devtools::document() without errors.
I just checked the code briefly to understand conceptually what you did, but didn't inspect it in detail.
Please write a brief summary of what you did in the PR description, then you can just merge.
Well done!
In this PR, we refactor the vaeac approach so that the torch modules are initiated through functions. Previously, when they were not inside functions, installation of shapr failed as it tried to evaluate torch::, which was not installed. We tried to fix this in #390, but due to technical issues on my end, I had to make a new PR.
Also fixed some typos in the Roxygen documentation and ensured that the progressr progress bar inside the vaeac approach is only called if progressr is available.

Details:
Instead of defining the torch modules directly at the top level of the package, we replaced them with functions that create the modules on demand.
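The original code snippets did not survive on this page, so here is a minimal hedged sketch of the pattern described above. The module name `my_layer` and its contents are hypothetical, not the actual shapr code; the point is only that wrapping the definition in a function defers evaluation of `torch::` until the function is called.

```r
# Before (hypothetical): the nn_module generator is created when the
# package is built/loaded, so `torch::nn_module` is evaluated immediately
# and installation fails if torch is not available.
#
# my_layer <- torch::nn_module(
#   initialize = function(n_in, n_out) {
#     self$linear <- torch::nn_linear(n_in, n_out)
#   },
#   forward = function(x) self$linear(x)
# )

# After (hypothetical): the module is created inside a function, so
# `torch::` is only evaluated when the module is actually requested,
# i.e. when the user runs the vaeac approach and torch is installed.
my_layer <- function(n_in, n_out) {
  torch::nn_module(
    initialize = function(n_in, n_out) {
      self$linear <- torch::nn_linear(n_in, n_out)
    },
    forward = function(x) self$linear(x)
  )(n_in, n_out)
}
```

With this pattern, `shapr` can be installed and loaded without torch; only calling `my_layer()` requires it.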
Some extra care had to be given to memory_layer, which had an internal environment shared between all instances of memory_layer. In the new version, we have to create an environment first and then pass it to the new version of memory_layer.
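A hedged sketch of why the shared environment now has to be created explicitly (the field names and module body are hypothetical, not the actual shapr implementation): when the module was defined at the top level, all instances could implicitly share the definition's enclosing environment; with the definition inside a function, each call creates a fresh enclosure, so shared state must be allocated once by the caller and passed in.

```r
# Hypothetical sketch of the new calling convention.
memory_layer <- function(tag, shared_env) {
  torch::nn_module(
    initialize = function(tag, shared_env) {
      self$tag <- tag
      # All instances receive the SAME environment object, so writes
      # made by one instance are visible to the others.
      self$shared_env <- shared_env
    },
    forward = function(x) {
      self$shared_env[[self$tag]] <- x  # stash activation for later reuse
      x
    }
  )(tag, shared_env)
}

# The caller creates the environment first, then passes it to every
# memory_layer instance that should share state.
shared_env <- new.env()
layer_a <- function() memory_layer("a", shared_env)
layer_b <- function() memory_layer("b", shared_env)
```

Environments in R are reference objects, so passing the same environment to each instance reproduces the sharing that the old top-level definition provided implicitly.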